Evaluation in K-12 Online Learning
Wednesday Morning Pre-Conference Workshops 8:00AM – 12:00PM
Tom Clark, TA Consulting, & Elizabeth Oyer, EvalSolutions Inc.

Session Information



Session Description

This session is designed for those undertaking or planning to undertake evaluation activities in online learning programs for improvement or accountability purposes. Participants will learn from each other and from experienced administrators and evaluators. The morning will be structured around a series of small-group round tables led by evaluators and administrators on topics of interest to the participants in their virtual school roles.
  • During the first round table session, participants will identify evaluation needs and formulate questions for their online learning program or that of a colleague.
  • In the second, they will define outcomes and discuss how to measure results.
  • In the third, they will discuss major unanswered questions about evaluation, engage in a question-and-answer session on them, and then discuss barriers to implementation of evaluation in their own organizations. A summary of the barriers identified will be shared with the iNACOL Research SIG at its Saturday, Nov. 12 meeting.
Co-Facilitators Tom Clark (TA Consulting) and Elizabeth Oyer (EvalSolutions Inc) are joined by Mary Brabson (Indiana Online Academy), Ron Cozart (Georgia Virtual School), Ronda Eshleman (Indiana Online Academy) and Liz Pape (VHS Inc).

Session Blog (presenter and participant interaction):

http://vss2011eval.wordpress.com


Session Twitter Hashtag: #vss308p1




    Presentation Materials and Contributions


    A compiled Presentation Slides.pdf and a Handouts.pdf will be posted here soon. In the meantime, a few of the materials are posted below.

    Evaluation Pre-Conference Handouts (all)

    Pre-Conference Presenter Slides (all)


    Pre-Conference on Evaluation in K-12 Online Learning

    Workgroup Process Notes - Nov. 9, 2011

    Organizational Mission:


    Shared elements of mission: quality education; online delivery; offering alternatives for students
    Differences in mission: Differences in program type and purpose result in differences in mission - for example, full-time vs. part-time programs, charter vs. non-charter, state virtual schools, the mission of the parent organization, and the key student audiences the school is intended to serve.

    Examples of questions raised about the missions and goals of online learning programs:
    Is there a separate mission for the virtual learning program in a public school district?
    How much attention do mission and goals need in an existing virtual school?

    Formulating evaluation questions:


    Cathy Cavanaugh - table summary:
    Three main areas that virtual schools need to look at in their evaluation questions:
    --Access to appropriate, quality instruction
    --Quality courses and instruction
    --Fit with student needs

    Are students learning better than in the traditional model?
    What are student, parent and staff attitudes that need to be addressed?

    Name a key goal of your program:


    Cost-effective alternative course options for students
    Ensure curriculum quality
    Ensure student progress toward graduation
    Strengthen core academic learning
    Understand student outcomes

    Write one Outcome Indicator for your program:


    Who: Minority students in our Pre-AP program
    What: will increase enrollment in AP classes
    Target: 10% each year, with target of 80% of minority students taking at least one AP class

    Comment - Elizabeth: can also look at it as closing the gap between the target population and the general population

    Who: At-risk students (not passing state tests) who will take a universal screening and intervention program
    What: will close the existing skill gap in reading and math compared to all students
    Target: 10% on state testing [is this reasonable - yes]

    Who: The program (a supplemental K-12 district program)
    What: will decrease student attrition by withdrawal or transfer
    Target: by 20% over two years

    Who: Students in our online learning program
    What: will increase their course completion rate
    Target: so that 50% of students are at or ahead of pace for the time elapsed
    Comment - Elizabeth: are there specific student populations of interest or concern that would help you narrow down your outcome indicator?
    Comment - Tom:

    [From the pre-conference Blog]

    Who: School administration
    What: willingness to experiment will increase
    Target: 15% over prior year

    Participant comment about this outcome indicator: we'd like to see greater adoption of online learning and renewal of courses. The data needed would be enrollments, adoption patterns across the field, attitudinal data, and more.

    Barriers to implementation of evaluation in our online learning program:


    Small staff - finding the time to do it.
    Staff - need feedback from others and access to a professional community for discussion
    Staff time or money - limited
    Our LMS does not generate all the data we wish it did
    Lack of clear understanding of formative evaluation; people jump to summative
    Stakeholders unfamiliar with terminology, e.g., the difference between formative and summative evaluation
    Lack of a process or vehicle for using formative evaluation for improvement
    Lack of broad consensus on evaluation standards
    Turnover - loss of organizational knowledge
    Political will to hear the truth
    Constantly changing and evolving field